Recognizing Implicit Discourse Relations via Repeated Reading: Neural Networks with Multi-Level Attention

Authors

  • Yang Liu
  • Sujian Li
Abstract

Recognizing implicit discourse relations is a challenging but important task in the field of Natural Language Processing. For such a complex text processing task, and in contrast to previous studies, we argue that it is necessary to repeatedly read the arguments and dynamically exploit the features useful for recognizing discourse relations. To mimic this repeated reading strategy, we propose neural networks with multi-level attention (NNMA), which combine the attention mechanism with external memories to gradually fix attention on the specific words helpful for judging the discourse relations. Experiments on the PDTB dataset show that our proposed method achieves state-of-the-art results. The visualization of the attention weights also illustrates how our model examines the arguments at each level and progressively locates the important words.
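
The abstract describes NNMA only at a high level. As a rough sketch of the multi-level attention idea, the PyTorch code below attends over the word states of both arguments at each level and uses the attended summaries to update an external memory vector, which a final classifier maps to a relation label. All module names, dimensions, and the memory-update rule here are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of multi-level attention over two arguments, assuming each
# argument has already been encoded into per-word hidden states (e.g. by a BiLSTM).
import torch
import torch.nn as nn


class MultiLevelAttention(nn.Module):
    def __init__(self, hidden_dim, num_levels, num_relations):
        super().__init__()
        self.num_levels = num_levels
        # One attention scorer and one memory-update layer per level (assumption).
        self.attn = nn.ModuleList(
            [nn.Linear(2 * hidden_dim, 1) for _ in range(num_levels)]
        )
        self.update = nn.ModuleList(
            [nn.Linear(3 * hidden_dim, hidden_dim) for _ in range(num_levels)]
        )
        self.classifier = nn.Linear(hidden_dim, num_relations)

    def attend(self, states, memory, level):
        # states: (batch, seq_len, hidden); memory: (batch, hidden)
        mem = memory.unsqueeze(1).expand(-1, states.size(1), -1)
        scores = self.attn[level](torch.cat([states, mem], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=-1)          # attention over words
        return torch.bmm(weights.unsqueeze(1), states).squeeze(1), weights

    def forward(self, arg1_states, arg2_states):
        # Initialize the memory with mean-pooled argument representations (assumption).
        memory = 0.5 * (arg1_states.mean(dim=1) + arg2_states.mean(dim=1))
        for level in range(self.num_levels):
            s1, _ = self.attend(arg1_states, memory, level)
            s2, _ = self.attend(arg2_states, memory, level)
            # "Repeated reading": refine the memory with what was attended to.
            memory = torch.tanh(
                self.update[level](torch.cat([memory, s1, s2], dim=-1))
            )
        return self.classifier(memory)


if __name__ == "__main__":
    # Toy usage with random word states for two arguments.
    model = MultiLevelAttention(hidden_dim=64, num_levels=3, num_relations=4)
    arg1 = torch.randn(2, 10, 64)   # batch of 2, 10 words per argument
    arg2 = torch.randn(2, 12, 64)
    print(model(arg1, arg2).shape)  # torch.Size([2, 4])
```

Inspecting the per-level attention weights returned by attend is what would produce the kind of visualization the abstract mentions, with attention sharpening on discriminative words at deeper levels.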

Similar articles

Implicit Discourse Relation Classification via Multi-Task Neural Networks

Without discourse connectives, classifying implicit discourse relations is a challenging task and a bottleneck for building a practical discourse parser. Previous research usually makes use of one kind of discourse framework such as PDTB or RST to improve the classification performance on discourse relations. Actually, under different discourse annotation frameworks, there exist multiple corpor...

Learning better discourse representation for implicit discourse relation recognition via attention networks

Humans comprehend the meanings and relations of discourses heavily relying on their semantic memory that encodes general knowledge about concepts and facts. Inspired by this, we propose a neural recognizer for implicit discourse relation analysis, which builds upon a semantic memory that stores knowledge in a distributed fashion. We refer to this recognizer as SeMDER. Starting from word embeddi...

Multi-task Attention-based Neural Networks for Implicit Discourse Relationship Representation and Identification

We present a novel multi-task attention-based neural network model to address implicit discourse relationship representation and identification through two types of representation learning: an attention-based neural network for learning discourse relationship representation with two arguments, and a multi-task framework for learning knowledge from annotated and unannotated corpora. The extensive e...

Predicting Implicit Discourse Relation with Multi-view Modeling and Effective Representation Learning

Discourse relations between two text segments play an important role in many natural language processing (NLP) tasks. Connectives strongly indicate the sense of discourse relations, yet in fact a large proportion of discourse relations have no connectives, i.e., implicit discourse relations. The key to implicit relation prediction is to correctly model the semantics of the two d...

Implicit Discourse Relation Recognition with Context-aware Character-enhanced Embeddings

For the task of implicit discourse relation recognition, traditional models utilizing manual features can suffer from the data sparsity problem. Neural models provide a solution with distributed representations, which can encode latent semantic information and are suitable for recognizing semantic relations between argument pairs. However, conventional vector representations usually adopt em...



Publication date: 2016